314 research outputs found

    The Digital Foundation Platform -- A Multi-layered SOA Architecture for Intelligent Connected Vehicle Operating System

Legacy AD/ADAS development at OEMs centers on developing functions on ECUs using services provided by the AUTOSAR Classic Platform (CP) to meet automotive-grade and mass-production requirements. The AUTOSAR CP couples hardware and software components statically and struggles to provide sufficient capacity for high-level intelligent driving functions, whereas the newer AUTOSAR Adaptive Platform (AP) is designed to support dynamic communication and to provide richer services and function abstractions for resource-intensive (memory, CPU) applications. Yet on both platforms, application development remains closely coupled to the supporting system software, which makes applications less scalable and flexible to develop and enhance, resulting in longer development cycles and slower time-to-market. This paper presents a multi-layered, service-oriented intelligent driving operating system foundation, which we name the Digital Foundation Platform, that provides abstractions for easier adoption of heterogeneous computing hardware. It features a multi-layer SOA software architecture in which each layer provides an adaptive north-bound service API for application developers. The proposed Digital Foundation Platform (DFP) has the significant advantage of decoupling the development of hardware, the operating system core, middleware, functional software and application software. It provides SOA at multiple layers and enables application developers from OEMs to customize and develop new applications, or enhance existing ones with new features, in either the autonomous driving domain or the intelligent cockpit domain, with great agility and less code through re-usability, thus reducing time-to-market. Comment: WCX SAE World Congress Experience 202
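The layered, north-bound service idea in this abstract can be illustrated with a minimal sketch. All names below (PerceptionService, GpuPerceptionService, make_perception_service) are hypothetical, not from the paper; the point is only that applications program against a layer's abstract API, so the hardware and OS binding beneath that layer can be swapped without touching application code.

```python
# Minimal sketch (hypothetical names) of a north-bound service API:
# applications depend on the abstract interface, never on the binding.
from abc import ABC, abstractmethod


class PerceptionService(ABC):
    """North-bound API a middleware layer might expose upward."""

    @abstractmethod
    def detect_obstacles(self, frame: bytes) -> list[dict]:
        ...


class GpuPerceptionService(PerceptionService):
    """One possible hardware binding; applications never import this
    class directly, so the accelerator can change without app changes."""

    def detect_obstacles(self, frame: bytes) -> list[dict]:
        # Placeholder inference call; a real layer would dispatch to the
        # compute runtime selected at deployment time.
        return [{"type": "vehicle", "confidence": 0.9}]


def make_perception_service() -> PerceptionService:
    # Service binding would normally be configuration-driven.
    return GpuPerceptionService()


app_service = make_perception_service()
print(app_service.detect_obstacles(b"\x00" * 16))
```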

    Shortest Path and Distance Queries on Road Networks: An Experimental Evaluation

Computing the shortest path between two given locations in a road network is an important problem that finds applications in various map services and commercial navigation products. The state-of-the-art solutions for the problem can be divided into two categories: spatial-coherence-based methods and vertex-importance-based approaches. The two categories of techniques, however, have not been compared systematically under the same experimental framework, as they were developed from two independent lines of research that do not refer to each other. This renders it difficult for a practitioner to decide which technique should be adopted for a specific application. Furthermore, the experimental evaluation of the existing techniques, as presented in previous work, falls short in several aspects. Some methods were tested only on small road networks with up to one hundred thousand vertices; some approaches were evaluated using distance queries (instead of shortest path queries), namely, queries that ask only for the length of the shortest path; a state-of-the-art technique was examined based on a faulty implementation that led to incorrect query results. To address the above issues, this paper presents a comprehensive comparison of the most advanced spatial-coherence-based and vertex-importance-based approaches. Using a variety of real road networks with up to twenty million vertices, we evaluated each technique in terms of its preprocessing time, space consumption, and query efficiency (for both shortest path and distance queries). Our experimental results reveal the characteristics of different techniques, based on which we provide guidelines on selecting appropriate methods for various scenarios. Comment: VLDB201
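For readers unfamiliar with the distinction this abstract draws between the two query types, the textbook Dijkstra baseline below (not one of the surveyed index-based techniques, such as contraction hierarchies) makes it concrete: a distance query needs only the accumulated length, while a shortest path query must also reconstruct the vertex sequence from predecessor links.

```python
# Textbook Dijkstra baseline illustrating distance vs. shortest path queries.
import heapq


def dijkstra(graph: dict, source: int, target: int):
    dist = {source: 0.0}
    pred = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                pred[v] = u
                heapq.heappush(heap, (nd, v))
    # A distance query stops here:
    length = dist.get(target, float("inf"))
    if length == float("inf"):
        return length, []
    # A shortest path query additionally walks the predecessor chain:
    path, node = [], target
    while node != source:
        path.append(node)
        node = pred[node]
    path.append(source)
    return length, path[::-1]


g = {0: [(1, 2.0), (2, 5.0)], 1: [(2, 1.0)], 2: []}
print(dijkstra(g, 0, 2))  # (3.0, [0, 1, 2])
```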

Family language policy and planning in China: the changing langscape


    Cardiac sodium channel palmitoylation regulates channel availability and myocyte excitability with implications for arrhythmia generation

Cardiac voltage-gated sodium channels (Nav1.5) play an essential role in regulating cardiac electrical activity by initiating and propagating action potentials in the heart. Altered Nav1.5 function is associated with multiple cardiac diseases, including long-QT3 and Brugada syndrome. Here, we show that Nav1.5 is subject to palmitoylation, a reversible post-translational lipid modification. Palmitoylation increases channel availability and late sodium current activity, leading to enhanced cardiac excitability and prolonged action potential duration. In contrast, blocking palmitoylation increases closed-state channel inactivation and reduces myocyte excitability. We identify four cysteines as possible Nav1.5 palmitoylation substrates. A mutation of one of these that is associated with cardiac arrhythmia (C981F) induces a significant enhancement of channel closed-state inactivation and ablates sensitivity to depalmitoylation. Our data indicate that alterations in palmitoylation can substantially control Nav1.5 function and cardiac excitability, and that this form of post-translational modification is likely an important contributor to acquired and congenital arrhythmias.

    Characterisation of ball impact conditions in professional tennis: matches played on hard court

To assess ball performance for research and development purposes requires a greater understanding of the impact conditions a tennis ball experiences in professional tournament play. Ball tracking information from three consecutive years of an ATP 250 tour event played on hard court was analysed. The frequencies of first serves, second serves, racket impacts and surface impacts were assessed per game and extrapolated to show how many impacts a single ball is subjected to. Where applicable, the pre- and post-impact velocity and angle were found and the distribution of each analysed. In total, data from 65 matches comprising 1,505 games were analysed. On average, each game contained 70.26 (± 16.23) impacts, of which 9.23%, 3.16%, 37.78% and 49.83% were first serves, second serves, racket impacts and surface impacts respectively. As a result, assuming all balls in play are used evenly, a single ball is expected to be subjected to 105 (± 24) impacts over the course of the nine games that it is in play. The results of the investigation could be used to design a wear protocol capable of artificially wearing tennis balls in a way that is representative of professional play.
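The per-ball figure follows from simple arithmetic on the per-game count. The sketch below reproduces it under the assumption (ours, not stated in the abstract) that the standard six match balls share the impacts evenly over the nine games they are in play.

```python
# Reproducing the abstract's per-ball estimate. The six-ball figure is an
# assumption on our part; the abstract states only that balls are shared
# evenly over the nine games they are in play.
impacts_per_game = 70.26
sd_per_game = 16.23
games_in_play = 9
balls_in_play = 6  # assumed number of match balls sharing the impacts

mean_per_ball = impacts_per_game * games_in_play / balls_in_play
sd_per_ball = sd_per_game * games_in_play / balls_in_play

print(f"{mean_per_ball:.0f} (+/- {sd_per_ball:.0f}) impacts per ball")  # 105 (+/- 24)
```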

    Characterisation of ball degradation events in professional tennis

Tennis balls are acknowledged to degrade with use and are replaced at regular intervals during professional matches to maintain consistency and uniformity in performance, such that the game is not adversely affected. Balls are subject to the International Tennis Federation's (ITF) ball approval process, which includes a degradation test to ensure a minimum standard of performance. The aim of this investigation was to establish whether the ITF degradation test can assess ball longevity and rate of degradation, and to determine whether there is a need for a new degradation test that is more representative of in-play conditions. Ball tracking data from four different professional events, spanning the three major court surfaces and including both men's and women's matches, were analysed. The frequencies of first serves, second serves, racket impacts and surface impacts were assessed, and the corresponding distributions of ball speed and (for surface impacts) impact angle were determined. Comparison of ball impact frequency and conditions between in-play data and the ITF degradation test indicated that the development of a new test, more representative of in-play data, would be advantageous in determining ball longevity and rate of degradation with use. Assessment of data from different surfaces highlighted that grass courts subjected the ball to fewer racket and surface impacts than hard or clay courts. In turn, this appears to influence the distribution of ball speed on impact with the surface or racket, suggesting that a surface-specific degradation test may be beneficial. As a result of these findings, a new test protocol has been proposed, utilising the in-play data to define the frequency of impacts and the impact conditions that equate to nine games of professional tennis across the different surfaces.

    Measurement of strain and strain rate during the impact of tennis ball cores

The aim of this investigation was to establish the strains and strain rates experienced by tennis ball cores during impact, to inform material characterisation testing and finite element modelling. Three-dimensional surface strains and strain rates were measured using two high-speed video cameras and corresponding digital image correlation software (GOM Correlate Professional). The results suggest that material characterisation testing to a maximum strain of 0.4 at a maximum rate of 500 s⁻¹ in tension, and to a maximum strain of −0.4 at a maximum rate of −800 s⁻¹ in compression, would encapsulate the demands placed on the material during impact and, in turn, define the range of properties required to capture the behaviour of the material during impact. This enables testing to be application-specific, and allows strain-rate-dependent properties to be established and incorporated in finite element models.
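As an illustration of how such figures can be extracted from digital image correlation output, the sketch below differentiates a strain-time trace by finite differences. The frame rate and strain values are invented for the example (chosen so the peak rates land near the magnitudes reported above), not data from the study.

```python
# Illustrative estimate of surface strain rate from a DIC strain-time trace.
# Frame rate and strain values are hypothetical, chosen so the peak rates
# land near the magnitudes quoted in the abstract.
import numpy as np

frame_rate_hz = 10_000        # hypothetical high-speed camera setting
dt = 1.0 / frame_rate_hz      # time between frames, s

# Hypothetical major-strain history of one surface facet during impact:
strain = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.34, 0.38,
                   0.40, 0.37, 0.31, 0.23, 0.14, 0.06, 0.01])

strain_rate = np.gradient(strain, dt)  # central differences, units s^-1

print(f"peak strain:                {strain.max():.2f}")
print(f"peak loading strain rate:   {strain_rate.max():.0f} s^-1")
print(f"peak unloading strain rate: {strain_rate.min():.0f} s^-1")
```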

    A decade of change in breastfeeding in China's far north-west

BACKGROUND: There have been considerable changes in breastfeeding practices in China over the past forty years. However, China is a very large country, and breastfeeding rates in different parts of China vary considerably. The objective of this paper is to identify and compare breastfeeding types and rates between 1994–1996 and 2003–2004 in Shihezi, Xinjiang Uygur Autonomous Region, PR China. METHODS: In 1994–1996, a study of breastfeeding (n = 2197) was undertaken in Shihezi, Xinjiang, PR China. A decade later, in 2003–2004, a longitudinal study (n = 545) of infant feeding practices was undertaken in the same area. RESULTS: The 'any breastfeeding' rates at 1, 4 and 6 months were 94%, 82% and 78% respectively in the early 1990s. A decade later, breastfeeding at 1 month was lower, but the rates at 4 and 6 months remained the same. In 2004 the 'full breastfeeding' rate at one month was significantly higher (57%) than a decade earlier (38%), but after 3 months there was a rapid decline. This reflected a shift in the way complementary foods were introduced: the initial introduction came later, but from a higher proportion of mothers. CONCLUSION: The rate of breastfeeding at one month was significantly lower in 2003–2004 compared to 1994–1996. The 'full breastfeeding' rates were initially higher, but after 3 months were lower. The Chinese national breastfeeding targets were not reached in either period of the study. These studies show the need to further promote full or exclusive breastfeeding, and further longitudinal studies are necessary to provide the detailed knowledge about risk factors required for health promotion programs.